18 research outputs found

    An MIP Approach to the U-line Balancing Problem With Proportional Worker Throughput

    One of the major challenges faced by manufacturing companies is to remain competitive in dynamic environments, where fluctuations in customer demand and production rates require systems capable of adapting in a practical and economical way. A U-shaped production cell is considered one of the most flexible designs for adapting the workforce level to varying conditions. However, re-balancing efforts are time-consuming and often require a new work allocation and line design. In this paper, a two-stage MIP model to determine the best cell design under varying workforce levels is proposed. The model seeks to maintain proportionality between throughput and the number of workers. Computational experiments considering various line configurations (up to 19 stations) and workloads (up to 79 tasks) are performed. The results show the proposed algorithm provides excellent results for all small- and medium-size problems addressed in this study, as well as for certain configurations of large problems. This approach can be used to generate lookup tables of line designs to help with quick reallocation of worker assignments on the shop floor with minimal disruption.
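
    To make the modeling idea concrete, the following is a minimal sketch of a plain single-stage, straight-line balancing MIP (assign tasks to stations, respect precedence, minimize cycle time). It is illustrative only and is not the paper's two-stage U-line model with proportional worker throughput; the task times, precedence relations, station count, and the use of the PuLP library are assumptions made for the example.

    import pulp

    # Hypothetical data: task -> processing time, precedence pairs (a before b),
    # and a fixed set of candidate stations.
    tasks = {"t1": 4, "t2": 6, "t3": 3, "t4": 5}
    precedence = [("t1", "t2"), ("t2", "t4")]
    stations = range(3)

    prob = pulp.LpProblem("line_balancing_sketch", pulp.LpMinimize)

    # x[t][s] = 1 if task t is assigned to station s
    x = pulp.LpVariable.dicts("x", (tasks, stations), cat="Binary")
    cycle = pulp.LpVariable("cycle_time", lowBound=0)

    prob += cycle  # objective: minimize the cycle time

    for t in tasks:  # every task is assigned to exactly one station
        prob += pulp.lpSum(x[t][s] for s in stations) == 1
    for s in stations:  # a station's total workload cannot exceed the cycle time
        prob += pulp.lpSum(tasks[t] * x[t][s] for t in tasks) <= cycle
    for a, b in precedence:  # a predecessor's station index must not exceed its successor's
        prob += pulp.lpSum(s * x[a][s] for s in stations) <= pulp.lpSum(s * x[b][s] for s in stations)

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    assignment = {t: s for t in tasks for s in stations if pulp.value(x[t][s]) > 0.5}
    print(assignment, "cycle time:", pulp.value(cycle))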

    Evaluating Threat Assessment for Multi-Stage Cyber Attacks

    Current practices to defend against cyber attacks are typically reactive yet passive. Recent research has proposed proactively predicting a hacker's target entities in the early stages of an attack. With prediction come false alarms and missed attacks. Very little has been reported on how to evaluate a threat assessment algorithm, especially for cyber security. Because of the variety and constantly changing nature of hacker behavior and network vulnerabilities, a cyber threat assessment algorithm is perhaps more susceptible to error than those in other application domains. This work sets forth the issues in evaluating cyber threat assessment algorithms and discusses the validity of various statistical measures. Simulation examples are provided to illustrate the pros and cons of using different metrics under various cyber attack scenarios. Our results show that the commonly used false positives and false negatives are necessary but not sufficient to evaluate cyber threat assessment algorithms.
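
    As a hedged sketch of the confusion-matrix bookkeeping the paper argues is necessary but not sufficient, the snippet below counts false alarms and missed attacks from predicted versus actual target sets. The function name, host names, and data are hypothetical.

    def assessment_metrics(predicted, actual, all_entities):
        """Confusion-matrix counts for a set of predicted attack targets."""
        tp = len(predicted & actual)                 # correctly predicted targets
        fp = len(predicted - actual)                 # false alarms
        fn = len(actual - predicted)                 # missed attacks
        tn = len(all_entities - predicted - actual)  # correctly ignored entities
        return {
            "precision": tp / (tp + fp) if tp + fp else 0.0,
            "recall": tp / (tp + fn) if tp + fn else 0.0,
            "false_alarm_rate": fp / (fp + tn) if fp + tn else 0.0,
        }

    hosts = {"web", "db", "mail", "dns", "file"}
    print(assessment_metrics(predicted={"web", "db"}, actual={"db", "file"}, all_entities=hosts))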

    TANDI: Threat Assessment of Network Data and Information

    Current practice for combating cyber attacks typically uses Intrusion Detection Sensors (IDSs) to passively detect and block multi-stage attacks. This work leverages Level-2 fusion that correlates IDS alerts belonging to the same attacker, and proposes a threat assessment algorithm to predict potential future attacker actions. The algorithm, TANDI, reduces the problem complexity by separating the models of the attacker's capability and opportunity, and fuses the two to determine the attacker's intent. Unlike traditional Bayesian-based approaches, which require assigning a large number of edge probabilities, the proposed Level-3 fusion procedure uses only four parameters. TANDI has been implemented and tested with randomly created attack sequences. The results demonstrate that TANDI predicts future attack actions accurately as long as the attack is not part of a coordinated attack and contains no insider threats. In the presence of abnormal attack events, TANDI alerts the network analyst for further analysis. This attempt to evaluate a threat assessment algorithm via simulation is the first in the literature, and should open up a new avenue in the area of high-level fusion.
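
    As a rough illustration of the fusion idea (separate capability and opportunity scores combined into a per-target threat ranking), here is a toy sketch. It is not the published TANDI algorithm; the weights, score values, and target names are invented for illustration.

    def fuse_threat(capability, opportunity, w_cap=0.5, w_opp=0.5):
        """Combine a capability score and an opportunity score, both assumed in [0, 1]."""
        return w_cap * capability + w_opp * opportunity

    # capability: what the correlated alerts suggest the attacker can do next;
    # opportunity: how exposed each candidate target is to that action.
    candidates = {
        "db_server":  {"capability": 0.8, "opportunity": 0.6},
        "mail_relay": {"capability": 0.4, "opportunity": 0.9},
    }
    for target, scores in sorted(candidates.items(), key=lambda kv: fuse_threat(**kv[1]), reverse=True):
        print(target, round(fuse_threat(**scores), 2))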

    Paroids: A generic environment for local search

    Perhaps the most important development of the past decade in discrete optimization has been the emergence of a coherent complexity theory providing formal definitions of the classes of problems tractable to different kinds of algorithms. The theory has isolated a vast problem family, NP-Complete, for which it is widely accepted that no formally efficient algorithm can be produced for any of its members. This conjecture has renewed mathematical interest in the heuristic/approximate approaches long used in an ad hoc way to tackle hard combinatorial problems. To date, most research on approximate combinatorial algorithms has been either very problem specific or, if generic, rooted in linear programming. This research is directed toward an alternative approach. The goal is to open the door to truly generic research in combinatorial heuristics by isolating and proving the viability of a canonical combinatorial environment in which heuristics can be structured, compared, and applied to numerous specific models. We define a new matroid-based combinatorial structure called a paroid. A number of classes of paroids are introduced, and their relation to classical models is shown. Structural properties of paroids and their relation to matroids are presented. Two optimization problems that arise from paroids are introduced, and natural reductions of well-known discrete models into the paroid optimization environment are also shown. The models studied are: k-Matroid Intersection, Matching, the Traveling Salesman Problem, Vertex Packing, Graph Partitioning, and Knapsack. We also present a local search procedure, called paroid search, which generalizes a number of problem-specific algorithms. These generalized procedures include Lin-Kernighan for the Traveling Salesman Problem, the greedy algorithm for independence systems, and an optimal algorithm for 2-Matroid Intersection. A number of algorithmic properties are shown, including PLS-Completeness and reachability.
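
    For readers unfamiliar with the family of procedures being generalized, the following is a bare-bones local-search (greedy descent) skeleton; the neighborhood and objective are placeholders, not the paroid optimization problems defined in the paper.

    def local_search(initial, neighbors, cost, max_iters=1000):
        """Greedy descent: repeatedly move to the best improving neighbor."""
        current = initial
        for _ in range(max_iters):
            improving = [n for n in neighbors(current) if cost(n) < cost(current)]
            if not improving:
                return current  # no improving neighbor: a local optimum
            current = min(improving, key=cost)
        return current

    # Placeholder example: minimize (x - 3)^2 over the integers with +/-1 moves.
    print(local_search(initial=17, neighbors=lambda x: [x - 1, x + 1], cost=lambda x: (x - 3) ** 2))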

    Data mining in an engineering design environment: applications from graph matching

    Data mining has been making inroads into the engineering design environment, an area that generates large amounts of heterogeneous data for which suitable mining methods are not readily available. For instance, an unsupervised data mining task (clustering) requires an accurate measure of distance or similarity. This paper focuses on the development of an accurate similarity measure for bills of materials (BOMs) that can be used to cluster BOMs into product families and subfamilies. The paper presents a new problem called Tree Bundle Matching (TBM) identified as a result of this research, gives a non-polynomial formulation and a proof that the problem is NP-hard, and suggests possible heuristic approaches.
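
    To give some intuition for why BOM clustering needs a tree-aware similarity, here is a naive recursive measure over BOMs represented as nested dictionaries. It is only a sketch, not the Tree Bundle Matching formulation, and the example BOMs are invented.

    def bom_similarity(a, b):
        """Naive similarity: shared components at each level, recursed into subassemblies."""
        union = set(a) | set(b)
        if not union:
            return 1.0  # two empty (leaf) BOMs are considered identical
        shared = set(a) & set(b)
        top = len(shared) / len(union)
        if not shared:
            return top
        sub = sum(bom_similarity(a[c], b[c]) for c in shared) / len(shared)
        return 0.5 * top + 0.5 * sub

    bike_a = {"frame": {}, "wheel": {"rim": {}, "spoke": {}}, "seat": {}}
    bike_b = {"frame": {}, "wheel": {"rim": {}, "tube": {}}}
    print(round(bom_similarity(bike_a, bike_b), 3))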

    AUTOMATING BATTLEFIELD EVENT REPORTING USING CONCEPTUAL SPACES AND FUZZY LOGIC FOR PASSIVE SPEECH INTERPRETATION

    This research explores the feasibility of performing passive information capture on voice data in order to analyze and classify the contents of interpersonal communication. The general form of this problem is very difficult, as fully automated speech understanding technology does not exist. This is further complicated by battlefield realities, including noise, jargon, and unstructured speech. However, when specific topics are isolated for extraction, the challenge becomes manageable. Conceptual Spaces is used as a fusion framework to classify data passively captured by traditional speech recognition software, coupled with fuzzy logic to match phonetics to jargon. Together these technologies prove to be a valuable fusion framework because of their ability to mitigate the high levels of error inherent in speech recognition. An initial study focused on recognizing important topics in communications between commanders and field personnel amidst background chatter. Results indicate the Conceptual Spaces model is flexible enough to define “spaces” for military events, and the underlying optimization model used for classification was robust and fast enough to quickly and accurately classify the noisy scenario data. This technology enables a new and more general class of automation, permitting conversion of passive speech into structured data. The authors gratefully acknowledge the support provided by the Defense Advanced Research Projects Agency (DARPA).
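
    One small ingredient of such a pipeline, fuzzy matching of noisy recognizer output against a jargon lexicon, can be sketched as follows. This does not implement Conceptual Spaces; the lexicon, threshold, and use of Python's difflib are assumptions made for illustration.

    from difflib import SequenceMatcher

    JARGON = ["medevac", "sitrep", "oscar mike", "roger"]  # hypothetical lexicon

    def best_match(heard, lexicon, threshold=0.7):
        """Return the closest jargon term if its string similarity clears the threshold."""
        score, term = max((SequenceMatcher(None, heard.lower(), t).ratio(), t) for t in lexicon)
        return term if score >= threshold else None

    for heard in ["medi vac", "sit rep", "tango"]:
        print(heard, "->", best_match(heard, JARGON))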